This paper studies differentially private stochastic convex optimization (DP-SCO) in the presence of heavy-tailed gradients, where only a $k$th-moment bound on the sample Lipschitz constants is assumed, instead of a uniform bound. The authors propose a reduction-based approach that achieves the first near-optimal error rates (up to logarithmic factors) in this setting. Specifically, under $(\varepsilon, \delta)$-approximate differential privacy, they achieve an error bound of $\frac{G_2}{\sqrt{n}} + G_k \cdot \big(\frac{\sqrt{d}}{n\varepsilon}\big)^{1 - \frac{1}{k}}$, up to a mild polylogarithmic factor in $\frac{1}{\delta}$, where $G_2$ and $G_k$ are the 2nd and $k$th moment bounds on the sample Lipschitz constants. This nearly matches the lower bound established by Lowy and Razaviyayn (2023). Beyond this basic result, the authors introduce a suite of private algorithms that further improve performance under additional assumptions: an optimal algorithm under a known-Lipschitz-constant assumption, a near-linear-time algorithm for smooth functions, and an optimal linear-time algorithm for smooth generalized linear models.
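Written as a display, the claimed excess-risk bound reads as follows. This restates the abstract's rate; $\hat{x}$ for the algorithm's output, $F$ for the population objective, and $\mathcal{X}$ for the constraint set are our notation, and $\widetilde{O}$ hides the polylogarithmic factor in $1/\delta$:

```latex
\[
\mathbb{E}\, F(\hat{x}) - \min_{x \in \mathcal{X}} F(x)
  \;=\; \widetilde{O}\!\left( \frac{G_2}{\sqrt{n}}
    \;+\; G_k \left( \frac{\sqrt{d}}{n \varepsilon} \right)^{1 - \frac{1}{k}} \right).
\]
```

The first term is the non-private statistical rate, and the second is the price of privacy under the $k$th-moment assumption; as $k \to \infty$ the exponent tends to 1 and the second term recovers the familiar $G\sqrt{d}/(n\varepsilon)$ rate of uniformly Lipschitz DP-SCO.

For intuition about the heavy-tailed setting only, the sketch below shows the standard clip-then-add-Gaussian-noise gradient primitive that much of this literature builds on. It is not the authors' reduction-based algorithm, and `clip_norm` and `noise_multiplier` are hypothetical caller-chosen parameters.

```python
import numpy as np

def clipped_noisy_gradient(per_sample_grads, clip_norm, noise_multiplier, rng):
    """Generic clipped-Gaussian gradient primitive (illustration only).

    Clips each per-sample gradient to L2 norm `clip_norm`, sums, adds
    Gaussian noise scaled to the clip threshold, and averages. This is
    NOT the paper's reduction-based algorithm; it is the standard
    building block for privatizing heavy-tailed gradients.
    """
    n, d = per_sample_grads.shape
    norms = np.linalg.norm(per_sample_grads, axis=1, keepdims=True)
    # Rescale any gradient whose norm exceeds clip_norm down to clip_norm.
    clipped = per_sample_grads * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    # After clipping, one sample changes the sum by at most clip_norm in
    # L2 norm, so Gaussian noise with std noise_multiplier * clip_norm
    # gives (eps, delta)-DP for a suitably chosen noise_multiplier.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=d)
    return (clipped.sum(axis=0) + noise) / n

# Example: t-distributed gradients with df = 2.5 have finite kth moments
# only for k < 2.5, i.e. a genuinely heavy-tailed regime.
rng = np.random.default_rng(0)
grads = rng.standard_t(df=2.5, size=(1000, 10))
step = clipped_noisy_gradient(grads, clip_norm=5.0, noise_multiplier=1.1, rng=rng)
```

The clipping threshold is the key tension this paper addresses: with only a moment bound on the sample Lipschitz constants, too small a threshold biases the gradients while too large a threshold forces excessive noise.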